Log-log Convergence for Noisy Optimization

Authors

  • Sandra Astete Morales
  • Jialin Liu
  • Olivier Teytaud
Abstract

We consider noisy optimization problems, without assuming that the variance of the noise vanishes in the neighborhood of the optimum. We show mathematically that simple rules with an exponential number of resamplings lead to a log-log convergence rate: the log of the distance to the optimum is linear in the log of the number of evaluations. The same holds when the number of resamplings is polynomial in the inverse step-size. We show empirically that this convergence rate is also obtained with a polynomial number of resamplings. In this polynomial resampling setting, using classical evolution strategies and an ad hoc choice of the number of resamplings, we seemingly get the same rate as that obtained with Estimation of Distribution Algorithms specifically designed for the noisy setting. We also experiment with non-adaptive polynomial resamplings. Compared to the state of the art, our results provide (i) proofs of log-log convergence for evolution strategies (which were not covered by existing results) in the case of objective functions with quadratic expectations and constant noise, (ii) log-log rates also for objective functions with expectation E[f(x)] = ||x − x*||^p, where x* is the optimum, and (iii) experiments with parameterizations different from those considered in the proof. These results suggest some simple revaluation schemes. This paper extends [1].
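To make the exponential resampling rule concrete, the following minimal Python sketch (an illustration under assumptions of our own, not the implementation used in the paper) runs a (1+1)-ES on a noisy sphere with E[f(x)] = ||x||^2 and constant noise variance, averaging r_n = ceil(B * zeta^n) fitness evaluations per point at iteration n. The function names and all constants (B, zeta, the step-size updates) are hypothetical choices made only for illustration.

    import numpy as np

    def avg_noisy_fitness(x, r, rng):
        # Average of r noisy evaluations: E[f(x)] = ||x||^2, plus additive
        # Gaussian noise whose variance does NOT vanish near the optimum.
        return float(x @ x) + rng.normal(size=r).mean()

    def resampled_one_plus_one_es(dim=3, iters=40, zeta=1.3, B=4, seed=0):
        # (1+1)-ES in which iteration n uses r_n = ceil(B * zeta**n)
        # resamplings (the exponential rule); B, zeta and the step-size
        # update are illustrative choices, not the paper's constants.
        rng = np.random.default_rng(seed)
        x = np.ones(dim)       # parent; the optimum x* is the origin
        sigma = 1.0            # mutation step-size
        evals, history = 0, []
        for n in range(iters):
            r = int(np.ceil(B * zeta ** n))           # exponential rule
            y = x + sigma * rng.standard_normal(dim)  # Gaussian offspring
            fx = avg_noisy_fitness(x, r, rng)
            fy = avg_noisy_fitness(y, r, rng)
            evals += 2 * r
            if fy <= fx:                              # elitist selection
                x = y
                sigma *= 2.0 ** (1.0 / dim)           # 1/5th-success-style
            else:
                sigma *= 2.0 ** (-0.25 / dim)
            history.append((evals, float(np.linalg.norm(x))))
        return history

    if __name__ == "__main__":
        hist = resampled_one_plus_one_es()
        log_e = np.log([e for e, _ in hist])
        log_d = np.log([d for _, d in hist])
        slope = np.polyfit(log_e, log_d, 1)[0]
        print(f"slope of log ||x - x*|| vs log(evaluations): {slope:.2f}")

If the log-log rate holds, the printed slope of log ||x − x*|| against the log of the number of evaluations should be roughly constant and negative, matching the linear relation stated above.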


Similar articles

Noisy Optimization: Convergence with a Fixed Number of Resamplings

It is known that evolution strategies in continuous domains might not converge in the presence of noise [3,14]. It is also known that, under mild assumptions, and using an increasing number of resamplings, one can mitigate the effect of additive noise [4] and recover convergence. We show new sufficient conditions for the convergence of an evolutionary algorithm with a constant number of resamplings...

A Convergence Analysis of Log-Linear Training

Log-linear models are widely used probability models for statistical pattern recognition. Typically, log-linear models are trained according to a convex criterion. In recent years, the interest in log-linear models has greatly increased. The optimization of log-linear model parameters is costly and therefore an important topic, in particular for large-scale applications. Different optimization ...

Further and stronger analogy between sampling and optimization: Langevin Monte Carlo and gradient descent

In this paper, we revisit the recently established theoretical guarantees for the convergence of the Langevin Monte Carlo algorithm for sampling from a smooth and (strongly) log-concave density. We improve the existing results when the convergence is measured in the Wasserstein distance and provide further insights into the very tight relations between, on the one hand, the Langevin Monte Carlo fo...

On the Estimation of Relative Risks via Log Binomial Regression

Given the well known convergence difficulties in fitting log binomial regression with standard GLM software, we implement a direct solution via constrained optimization which avoids the circumventions found in the literature. The use of a log binomial model is motivated by our interest in directly estimating relative risks adjusted for confounders. A Bayesian log binomial regression model is al...

Regret Analysis for Continuous Dueling Bandit

The dueling bandit is a learning framework wherein the feedback information in the learning process is restricted to a noisy comparison between a pair of actions. In this research, we address a dueling bandit problem based on a cost function over a continuous space. We propose a stochastic mirror descent algorithm and show that the algorithm achieves an O(√(T log T))-regret bound under strong ...


Journal:

Volume:   Issue:

Pages:  -

Publication date: 2013